
    Mobile Robot Position Determination

    Sensory-based robot control for automatic manipulations

    Intelligent robots require sensor-based control to perform complex operations and to deal with uncertainty in the environment. Feedback from the work environment can come from different sensors, for example force/torque, visual, and tactile sensors. Many applications of robot manipulators require that control be applied not only to the position of the gripper but also to the force exerted by the tool on the object. An adaptive controller has been designed to control the position and contact force in the Cartesian coordinate system. A successful simulation and lab experiments illustrate the applicability of the approach.

    Vision systems usually provide more global information about the environment than force/torque sensors. Visual feedback represents a typical sensing system in which camera images provide feedback information, for example for grasping a moving object with a robot manipulator. Because image processing is time-consuming, information about the target position is delayed and not available instantaneously to the controller. Therefore, the present and future positions have to be predicted in real time. Since the dynamics of the target are assumed to be unknown, prediction of the object position is accomplished using a model such as an auto-regressive discrete-time model. The predicted values and the current end-effector position determine the desired trajectory point (subgoal) for the motion. The planner adapts to changes in the target position on-line. The desired trajectory is tracked by the end-effector, which is controlled by a self-tuner. A simulation study and lab experiments demonstrate the grasping of a moving target by a manipulator by means of visual feedback.

    Contact can be sensed by force/torque and tactile sensors. Tactile sensors measure forces at specific points between the object and the tactile pads.
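The delay-compensating prediction step described above can be sketched as follows: auto-regressive coefficients are fitted to past (delayed) camera measurements by least squares and then used to extrapolate the target position a few sample periods ahead. The AR order, the constant-velocity sample data, and the function names are illustrative assumptions, not the implementation used in the project.

```python
import numpy as np

def fit_ar(samples, order):
    # Fit AR coefficients a_1..a_p by least squares, so that
    # x[k] ~ a_1*x[k-1] + ... + a_p*x[k-p].
    X = np.array([samples[i - order:i][::-1] for i in range(order, len(samples))])
    y = np.array(samples[order:])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict_ahead(samples, coeffs, steps):
    # Iterate the fitted model to predict `steps` future values,
    # compensating for the image-processing delay.
    hist = list(samples)
    for _ in range(steps):
        window = hist[-len(coeffs):][::-1]
        hist.append(float(np.dot(coeffs, window)))
    return hist[len(samples):]

# Hypothetical target moving at constant velocity along one Cartesian axis.
positions = [0.10 * k for k in range(20)]   # delayed camera measurements [m]
a = fit_ar(positions, order=2)
future = predict_ahead(positions, a, steps=3)   # ~ [2.0, 2.1, 2.2]
```

The predicted positions would then feed the trajectory planner as subgoals for the end-effector.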
    A specific task of placing a parallelepiped object on an unknown flat surface using information from force/torque and tactile sensors is considered. Three types of contact (point, line, and plane) can occur between the object and the surface. The type of contact is determined from the force/torque sensor readings under the assumption that the object shape is known. The manipulator servoing strategy depends on the type of contact. During line contact, the angle between the object base and the contact surface is calculated from the tactile sensor readings. The desired final position and orientation of the end-effector, which places the object on the plane surface, are determined from the current end-effector position and orientation and the obtained angle. Experiments successfully demonstrate the approach by calculating the point and type of contact, and the angle between the object base and the flat surface.
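A minimal sketch of how the contact type might be inferred from wrist force/torque readings, assuming a pure normal contact load and a known rectangular object base. The center-of-pressure formula and the corner/edge decision rule are hypothetical illustrations, not taken from the source.

```python
def center_of_pressure(force, torque):
    # Estimate the contact centroid in the base plane from wrist
    # force/torque, assuming a pure normal load f = (0, 0, fz).
    # From torque = r x f:  x = -ty/fz,  y = tx/fz.
    fx, fy, fz = force
    tx, ty, tz = torque
    return -ty / fz, tx / fz

def classify_contact(cop, half_x, half_y, tol=1e-3):
    # Hypothetical decision rule for a rectangular base of size
    # (2*half_x) x (2*half_y): centroid at a corner -> point contact,
    # on one edge -> line contact, strictly inside -> plane contact.
    x, y = cop
    on_x = abs(abs(x) - half_x) < tol
    on_y = abs(abs(y) - half_y) < tol
    if on_x and on_y:
        return "point"
    if on_x or on_y:
        return "line"
    return "plane"

# 10 N reaction acting along one edge of a 0.10 m x 0.06 m base.
cop = center_of_pressure((0.0, 0.0, 10.0), (0.0, -0.5, 0.0))
contact = classify_contact(cop, half_x=0.05, half_y=0.03)   # "line"
```

In the line-contact case, a tactile-pad reading would then supply the tilt angle needed to compute the final end-effector orientation.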

    Mars Umbilical Technology Demonstrator

    The objective of this project is to develop autonomous umbilical mating for the Mars Umbilical Technology Demonstrator (MUTD). The MUTD shall provide electrical power and fiber-optic data cable connections between two simulated Mars vehicles. The Omnibot provides the mobile base for the system. The mate-to umbilical plate is mounted on a three-axis Cartesian table, which is installed on the Omnibot mobile base. The Omnibot is controlled in a teleoperated mode: using the vision system, the operator guides the Omnibot close to the mate-to plate. The information received from four ultrasonic sensors is then used to identify the position of the mate-to plate and mate the umbilical plates autonomously. Successful experiments verify the approach.
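One way four ultrasonic range readings could identify the position of the mate-to plate is a least-squares plane fit, recovering the standoff distance and the plate's tilt relative to the approach axis. The square sensor layout, axis conventions, and function below are assumptions for illustration, not the project's actual sensor geometry.

```python
import numpy as np

# Hypothetical layout: four ultrasonic sensors at the corners of a
# 0.2 m square, each measuring range along the approach (z) axis.
SENSOR_XY = np.array([[-0.1, -0.1], [0.1, -0.1], [0.1, 0.1], [-0.1, 0.1]])

def plate_pose(ranges):
    # Least-squares fit of the plane z = a*x + b*y + c through the four
    # range readings; returns the standoff distance c and the tilt
    # angles about the x and y axes (sign conventions assumed).
    A = np.column_stack([SENSOR_XY, np.ones(4)])
    (a, b, c), *_ = np.linalg.lstsq(A, np.asarray(ranges, float), rcond=None)
    tilt_x = np.degrees(np.arctan(b))    # rotation about x [deg]
    tilt_y = np.degrees(np.arctan(-a))   # rotation about y [deg]
    return c, tilt_x, tilt_y

# All four sensors reading 0.30 m means the plate is square-on at 0.30 m.
dist, tx, ty = plate_pose([0.30, 0.30, 0.30, 0.30])
```

The fitted distance and tilt would then drive the three-axis Cartesian table to align and mate the plates.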